
An intelligent decision supporting system for international classification of functioning, disability, and health

Abstract

Background

In recent years, the population structure in Taiwan has changed dramatically. Based on concerns about social welfare issues, the Taiwanese government began to seek principles for the assessment of disability. After seven years of careful evaluation, the World Health Organization’s International Classification of Functioning, Disability, and Health (abbreviated ICF) was officially adopted as Taiwan’s assessment standard. However, most of the ICF assessment procedures are sophisticated and time-consuming.

In this paper, we propose a sensor-based decision supporting system for the ICF. Our prototype system aims to reduce the burden on medical staff and to assist subjects in performing the assessments.

Methods

This paper integrates multiple devices, including the ASUS Xtion™, temperature/acceleration/gyro sensors on an Arduino, and ZigBee, to measure the mobility of limbs and joints. The subject’s assessment log is then recorded in a database so that medical staff can remotely monitor the condition of subjects immediately and analyze the results later. Additionally, our system provides a user-friendly interface for the detection of dementia.

Results

In this paper, three experiments were conducted for different purposes. The first experiment compared the temperature variation measured by a thermometer and by our device. Moreover, we invited 20 elders aged 65 to 80 to use our system, and all of them gave positive feedback. Two of the elders were invited to perform the full dementia assessment, and the results show that neither of them had signs of dementia. In addition, the joint movement assessment was performed by a 67-year-old elder, and the result shows that the elder had good physical function and could take care of daily life.

Conclusions

The proposed system has the potential to help users perform the ICF assessment more effectively and to benefit medical staff and society. With current technology, the integration of sensor network systems and artificial intelligence approaches will become more and more important. We developed a simple interface for users to operate when performing the ICF assessment. In addition, the early detection of dementia has the potential to provide patients with an increased level of precaution, which may improve quality of life.

Introduction

The “People with Disabilities Rights Protection Act” was revised and promulgated in 2007, and the new disability classification assessment method and disability certification application process were scheduled to come into practice on July 11, 2012 (Directorate-General for the Information Society and Media 2010). People who are physically challenged can apply for social welfare services according to the results of the disability assessment.

At the same time, Taiwan is facing the impacts of an “aging society with fewer children”, including demographic imbalance, dependency burden, and a lack of elderly nursing care services. Based on concerns about these social welfare issues, the Taiwanese government began to seek principles for the assessment of disability.

As a result, after seven years of careful evaluation, the World Health Organization’s International Classification of Functioning, Disability, and Health (abbreviated ICF) was officially adopted as Taiwan’s assessment standard.

For this new international standard, the WHO changed the classification from the 16 categories of the previous policies to an 8-category coding system (Directorate-General for the Information Society and Media 2010; World Health Organization 2001; World Health Organization 2007). The old assessment methods and service application processes were also modified accordingly. One of the most important changes in the new policies is that the evaluation results are no longer valid for the subject’s entire lifetime; from now on, several aspects of life have to be re-evaluated at least once every five years.

With these changes, not only are the methods used to apply for social welfare services and to undergo disability assessments transformed, but the government is also required to establish new follow-up social welfare services, evaluation indices, evaluation tools, evaluation workflows, etc. Though the ICF provides a better system for evaluating disability in accordance with systematic regulations, it takes both people with disabilities and medical staff more time to perform and complete the evaluation (Liao and Huang 2009; Shotton et al. 2011).

Due to the serious shortage of medical human resources in Taiwan, in this paper we propose a sensor-based decision supporting system for the ICF. At the moment, we focus on the assessment of disability and dementia. We developed a friendly interface that leads users through the actions required for the ICF assessment according to Mobility of joint functions (ID b701). In order to detect users’ movements, this project uses the Xtion PRO LIVE to measure and assess the subject’s actions and activities, combined with OpenNI2 and NiTE2 to construct the skeleton. We also integrated multiple devices, including the ASUS Xtion™, ZigBee, and temperature/acceleration/gyro sensors on an Arduino, into a wearable device that captures users’ movements. There are many reports on using the Kinect, Xtion PRO LIVE, and other related devices for motion capture (Mulvenna and Nugent 2010), and there are also many software development tools available for implementing interfaces with these devices.

The subject can easily wear our multi-sensor monitoring device on the wrist, like a bracelet, during the movement assessment test. This wearable device combines temperature, acceleration, and gyro sensors with an Arduino. The temperature sensor measures the subject’s skin temperature over the wrist vein so that the inspector can monitor the subject’s health state during the test. The device also includes a gyroscope that detects the rotation of the wrist for the testing items, ensuring that the subject raises his or her hands in the correct orientation. In addition, we use the accelerometer to check whether the subject’s movement is smooth: if the subject finishes a testing item with uneven movement, the subject might have a potential joint disease. The observation of movement can therefore be used as meaningful health information.

A dementia assessment interface with a gesture recognizer based on virtual buttons was developed to enable users to perform the test under more convenient and comfortable conditions. This project uses a webcam combined with OpenCV to develop this interactive GUI. We designed the dementia questions according to the ICF items related to dementia: Orientation functions (ID b114), Intellectual functions (ID b117), Memory functions (ID b144), and Higher-level cognitive functions (ID b164). In this project we assume that all subjects have the basic abilities to answer questions and to recognize words, so that subjects can take the dementia assessment test by themselves.

To relieve the burden on medical staff and increase the efficiency of the ICF evaluation work, our system collects the subjects’ test results in a database so that medical staff can keep up to date with the subjects’ conditions through mobile phones or the Internet (Eriksson et al. 2005; Morris 1993).

In the current version of the prototype system, a trained volunteer is still needed to manage the movement assessment assistance interface on the computer and to help subjects perform the test.

Background

As a classification, the ICF systematically groups the different domains of a person in a given health condition into two parts, each with two components.

Part one, Functioning and Disability, has the components (a) Body Functions and Structures and (b) Activities and Participation. Part two, Contextual Factors, has the components (c) Environmental Factors and (d) Personal Factors.

In this project we implement a method that calculates the degree of joint movement in order to measure the activity of body movement. The questions in the dementia assessment test are designed by us according to the parts of the Clinical Dementia Rating (abbreviated CDR) that do not require medical staff to question subjects in person.

Skeleton tracking

In this project, we use the Xtion PRO LIVE as the sensor for skeleton tracking. The Xtion PRO LIVE, made by ASUS, uses infrared sensors, adaptive depth detection technology, color image sensing, and an audio stream to capture a user’s real-time image, movement, and voice, making user tracking more precise (Figure 1). We use Visual C++ to develop the joint detection.

Figure 1. Xtion PRO LIVE.

OpenNI defines 24 joints for a person, but NiTE2 offers tracking of 15 of them.
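
A minimal sketch of how joint positions can be retrieved with the OpenNI2/NiTE2 user-tracker API in C++ is given below; it illustrates the API rather than reproducing the exact code of our joint detection, and error handling is reduced to the essentials.

```cpp
// Minimal NiTE2 skeleton-tracking sketch (assumes OpenNI2 and NiTE2 are installed).
#include <NiTE.h>
#include <cstdio>

int main()
{
    if (nite::NiTE::initialize() != nite::STATUS_OK) return 1;

    nite::UserTracker tracker;
    if (tracker.create() != nite::STATUS_OK) return 1;

    for (int frame = 0; frame < 300; ++frame)            // roughly 10 s at 30 fps
    {
        nite::UserTrackerFrameRef frameRef;
        if (tracker.readFrame(&frameRef) != nite::STATUS_OK) continue;

        const nite::Array<nite::UserData>& users = frameRef.getUsers();
        for (int i = 0; i < users.getSize(); ++i)
        {
            const nite::UserData& user = users[i];
            if (user.isNew())
                tracker.startSkeletonTracking(user.getId());
            else if (user.getSkeleton().getState() == nite::SKELETON_TRACKED)
            {
                // NiTE2 exposes 15 of the 24 OpenNI joints; the right elbow is one of them.
                const nite::SkeletonJoint& elbow =
                    user.getSkeleton().getJoint(nite::JOINT_RIGHT_ELBOW);
                if (elbow.getPositionConfidence() > 0.5f)
                    printf("Right elbow: (%.1f, %.1f, %.1f) mm\n",
                           elbow.getPosition().x, elbow.getPosition().y,
                           elbow.getPosition().z);
            }
        }
    }

    nite::NiTE::shutdown();
    return 0;
}
```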

Dementia assessment test

The Clinical Dementia Rating (CDR) was developed by the Memory and Aging Project at Washington University School of Medicine in 1979 for staging the severity of dementia (Reed and Bufka 2006). The CDR is a five-point scale in which CDR-0 connotes no cognitive impairment; the remaining four points represent various stages of dementia:

  • CDR-0.5 = very mild dementia

  • CDR-1 = mild dementia

  • CDR-2 = moderate dementia

  • CDR-3 = severe dementia

There are six aspects in the Clinical Dementia Rating scale: memory, orientation, judgment and problem solving, community affairs, home and hobbies, and personal care (Muilder and Stappers 2009). Moreover, the dementia assessment items in the ICF are: Orientation functions (ID b114), Intellectual functions (ID b117), Memory functions (ID b144), and Higher-level cognitive functions (ID b164). We designed the questions referring to the above aspects, except for home and hobbies and community affairs in the Clinical Dementia Rating scale; caregivers deal with the questions covering the remaining aspects.

Temperature sensing

Normal human body temperature depends upon the place in the body at which the measurement is made, the time of day, and the person’s level of activity. Body temperature cycles regularly up and down throughout the day.

The TMP36 is a low-voltage, precision centigrade temperature sensor (Figure 2). It provides a voltage output that is linearly proportional to the Celsius temperature, and the output voltage can be converted to temperature easily using the scale factor of 10 mV/°C. The Arduino is a single-board microcontroller intended to make the use of electronics in multidisciplinary projects more accessible. In this project we develop the temperature sensing device with the open-source Arduino software, using the TMP36 as the sensor and combining it with an Arduino to sense the temperature via the wrist vein.

Figure 2. TMP36.

Gyroscope

In this paper we use the L3G4200D as the sensor to detect the rotation of the wrist (Figure 3). It is a three-axis, ultra-stable digital MEMS motion sensor that gives stable sensitivity over temperature and time. The L3G4200D module features an on-board low-dropout voltage regulator that accepts an input supply in the range of 3.6 V to 6 V DC, and the sensor has a user-selectable full scale of ±250, ±500, or ±2000 degrees per second.

Figure 3. L3G4200D.
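
To check that the wrist is rotated in the correct orientation, the angular rate reported by the gyroscope can be integrated over time into a rotation angle. The sketch below is an illustration only: readGyroZdps() is a hypothetical placeholder for the actual L3G4200D register reads over I2C/SPI, the single-axis integration is a simplification, and the 70° target simply mirrors the wrist flexion standard listed in the Results.

```cpp
// Sketch: estimate wrist rotation by integrating one gyro axis over time (Arduino-style C++).
float readGyroZdps() {
  // Hypothetical placeholder: in the real device this would read the L3G4200D output
  // registers over I2C or SPI and return the calibrated z-axis rate in degrees/second.
  return 0.0f;  // stub value so the sketch compiles
}

float rotationDeg = 0.0f;       // rotation accumulated since the test item started
unsigned long lastMs = 0;

void setup() {
  Serial.begin(9600);
  lastMs = millis();
}

void loop() {
  unsigned long now = millis();
  float dt = (now - lastMs) / 1000.0f;    // elapsed time in seconds
  lastMs = now;

  rotationDeg += readGyroZdps() * dt;     // rectangular integration of angular rate

  // Assumed criterion for illustration: the 70-degree wrist flexion standard from the
  // joint mobility items is taken as the target rotation.
  if (rotationDeg >= 70.0f) {
    Serial.println("Wrist reached the target orientation");
  }
  delay(20);                              // roughly 50 Hz sampling
}
```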

Accelerometer

The MMA7455L is a digital-output, low-power, low-profile capacitive micromachined accelerometer featuring pulse detection for quick motion detection (Figure 4).

Figure 4. MMA7455L.

The 0 g offset and sensitivity are factory set and require no external devices. The 0 g offset can be calibrated by the customer using the assigned 0 g registers, and g-Select allows a choice of three acceleration ranges (2 g/4 g/8 g). Because the MMA7455L includes a standby mode, it is suitable for handheld, battery-powered electronics.

Wireless technology

ZigBee is a wireless technology that uses a high-level communication protocol to create personal area networks (Figure 5). Each ZigBee module can transmit data to and receive data from other ZigBee modules, which allows reliable communication between microcontrollers and computers. We use XCTU, a free multi-platform application designed to enable developers to interact with ZigBee modules, to manage and configure the modules.

Figure 5. XBee Series 2.
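
As a sketch of how readings can reach the computer over the ZigBee link: the pin wiring and the transparent coordinator/router configuration in XCTU below are assumptions about a typical setup, not a specification of our exact hardware.

```cpp
// Sketch: relay a sensor value from the Arduino to the computer through a ZigBee (XBee) link.
// Assumes both XBee Series 2 modules were set up in XCTU as a transparent
// coordinator/router pair at 9600 baud, and that the module is wired to pins 2/3.
#include <SoftwareSerial.h>

SoftwareSerial xbee(2, 3);   // RX, TX (assumed wiring)

void setup() {
  xbee.begin(9600);
}

void loop() {
  int reading = analogRead(A0);   // any sensor on analog pin 0
  xbee.print("A0,");
  xbee.println(reading);          // the paired module forwards this line to the PC
  delay(500);
}
```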

Methods

A user-friendly interface was developed for the ICF assessment in this paper. The system is based on multi-sensor technology that integrates multiple devices, including an ASUS Xtion™ and temperature/acceleration/gyro sensors, and we invited subjects to perform the assessment through our system. The subject’s assessment log, which includes the subject’s personal information, assessment results, and medical history, is then recorded in the database. Our system is intended to extract useful information for medical staff and to support decision making. Written informed consent was obtained for the publication of this report and any accompanying images.

System architecture

As shown in Figure 6, we have implemented two main assessment functions in our prototype system: joint mobility measurement and dementia progression assessment.

Figure 6. System architecture.

The first process of our system is joint mobility measurement. At the beginning of the measurement, the subject needs to stand in a designated position so that the Xtion PRO LIVE can detect him or her successfully. After successful detection, the subject can start the measurement. In addition, during the assessment, the subject is asked to wear our sensing device on his/her wrist; the device carries three sensors, namely a body temperature sensor, an accelerometer, and a gyroscope. After the subject completes a designated posture, the inspector can move on to the next movement. After all testing movements are completed, the results are recorded in the database.

As for the dementia progression assessment, our system randomly picks out 10 questions from the 100 designed questions in the database. The system implements a multimodal interface that lets subjects input their answers either with a traditional keyboard-mouse combination or by gesture, using virtual push-buttons on the screen. We implemented two input methods because elderly users may be unable to control a mouse properly, and we consider gesture input a more intuitive way for the elderly to operate the system.
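
A minimal C++ sketch of the random selection and shuffling described above is given below; the Question struct and the in-memory pool are illustrative placeholders, since our system draws the questions from the database.

```cpp
// Sketch: draw 10 of 100 questions without repetition and shuffle the four options of each.
#include <algorithm>
#include <array>
#include <random>
#include <string>
#include <vector>

struct Question {
    std::string text;
    std::string correctAnswer;            // track the answer by value, not by position
    std::array<std::string, 4> options;   // includes the correct answer
};

std::vector<Question> pickTestSet(std::vector<Question> pool, std::size_t count = 10)
{
    std::mt19937 rng(std::random_device{}());

    // Shuffle the whole pool, then keep the first `count` questions: selection without repetition.
    std::shuffle(pool.begin(), pool.end(), rng);
    if (pool.size() > count) pool.resize(count);

    // Shuffle the options of every selected question so answers cannot be memorized by position.
    for (Question& q : pool)
        std::shuffle(q.options.begin(), q.options.end(), rng);

    return pool;
}
```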

Finally, medical staff can query and check the assessment results through their mobile devices or computers by querying the database.

Results

Joints mobility measurement

In order to record the subject’s body condition, the system takes a picture of the subject’s whole body through the camera on the Xtion PRO LIVE at the beginning of the measurement. Next, the system shows a video of the measurement item on the joints mobility measurement interface to instruct users directly, instead of using written or oral directions (Figure 7). For the instruction video, we pre-recorded a demonstration animation so that the subject can imitate the movement in the animation to finish the testing items (Figure 8). We use the Xtion PRO LIVE for joint skeleton tracking and then calculate whether the user’s joint movement reaches the standard degree or not. If a user cannot reach the standard degree within 60 seconds, the system records the latest detected extension and continues to the next measurement item.

Figure 7. Joints mobility measurement interface. (Figure translation: (1) 請跟隨影片做動作: Please follow the video to perform the test movement, (2) 右關節測試: Assessment of right joints, (3) 左關節測試: Assessment of left joints, (4) 右肩關節伸展結果: The result of right shoulder extension, (5) 右手軸彎曲結果: The result of right elbow extension, (6) 右腳後跟伸結果: The result of right hip flexion, (7) 右髖關節測試結果: The result of right hip extension, (8) 右膝關節測試結果: The result of right knee extension).

Figure 8. Recording the demonstration.
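
The joint angle itself can be computed from three tracked joint positions as the angle at the middle joint between the two limb segments, using positions retrieved as in the earlier skeleton-tracking sketch. The helper below is a sketch rather than our exact code; the Vec3 struct stands in for nite::Point3f.

```cpp
// Sketch: angle (in degrees) at a joint, e.g. the elbow angle from shoulder-elbow-hand positions.
#include <cmath>

struct Vec3 { float x, y, z; };   // stand-in for nite::Point3f

float jointAngleDeg(const Vec3& a, const Vec3& b, const Vec3& c)
{
    // Vectors from the middle joint b toward the two neighbouring joints.
    const float ux = a.x - b.x, uy = a.y - b.y, uz = a.z - b.z;
    const float vx = c.x - b.x, vy = c.y - b.y, vz = c.z - b.z;

    const float dot  = ux * vx + uy * vy + uz * vz;
    const float lenU = std::sqrt(ux * ux + uy * uy + uz * uz);
    const float lenV = std::sqrt(vx * vx + vy * vy + vz * vz);
    if (lenU == 0.0f || lenV == 0.0f) return 0.0f;            // degenerate input

    float cosTheta = dot / (lenU * lenV);
    cosTheta = std::fmax(-1.0f, std::fmin(1.0f, cosTheta));   // clamp rounding error
    return std::acos(cosTheta) * 180.0f / 3.14159265f;
}

// Example use: compare against a standard degree, e.g. 145 degrees for elbow extension.
bool reachesStandard(float measuredDeg, float standardDeg)
{
    return measuredDeg >= standardDeg;
}
```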

Joints mobility measurement comprises two parts: the upper extremity and the lower extremity.

According to the items under Mobility of joint functions (ID b701) in the ICF, we selected the following measurement items:

Upper extremity

  • Mobility of shoulder

    Range of motion at shoulder extension (standard degree: 180)

    Range of motion at shoulder flexion (standard degree: 60)

    Joint function (standard degree: 240)

  • Mobility of elbow

    Range of motion at elbow extension (standard degree: 145)

    Range of motion at elbow flexion (standard degree: 0)

    Joint function (standard degree: 145)

  • Mobility of wrist

    Range of motion at wrist extension (standard degree: 80)

    Range of motion at wrist flexion (standard degree: 70)

    Joint function (standard degree: 150)

Lower extremity

  • Mobility of hip

    Range of motion at hip extension (standard degree: 125)

    Range of motion at hip flexion (standard degree: 10)

    Joint function (standard degree: 135)

  • Mobility of knee

    Range of motion at knee extension (standard degree: 145)

    Range of motion at knee flexion (standard degree: 0)

    Joint function (standard degree: 140)

In this paper, our system was tested by a 67-year-old elder. Through our system, the subject’s joint mobility could be measured and quantified, as shown in Table 1. According to this result, the elder had good physical function and could take care of daily life.

Table 1 The joints mobility measurement results of the 67-year-old subject

Dementia assessment interface

The dementia assessment interface is based on gesture recognition (Figure 9). Users can choose their answers by pressing virtual buttons on the screen or simply by using the mouse; the interface thus provides two simple and easy methods for users to operate the system. We designed the questions and options according to the respective aspects of the Clinical Dementia Rating scale and the corresponding ICF items. The system randomly picks out 10 questions from the 100 designed questions in the database. All questions are multiple choice, with subjects choosing one answer from four options. The order of questions and options is randomized to prevent subjects from memorizing the answers from a previous test. Our system was tested by 20 elders aged from 65 to 80 with the help of volunteers (Figure 10). All of them found using our system interesting and engaging; their reactions show that our system arouses their interest and increases their willingness to take the dementia test.

Figure 9. Dementia assessment interface. (Figure translation: (1) 今天是星期幾?: What day is today? (2) 一: Monday, (3) 三: Wednesday, (4) 二: Tuesday, (5) 五: Friday).

Figure 10. An elder taking the dementia test.
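
The exact gesture recognizer behind the virtual buttons is not detailed in this paper; one simple realization, shown here only as a sketch, is to watch for motion inside each button’s rectangle using frame differencing in OpenCV. The button position, threshold, and trigger ratio below are illustrative values, not our system’s settings.

```cpp
// Sketch: trigger a virtual on-screen button when enough motion appears inside its rectangle.
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);
    if (!cap.isOpened()) return 1;

    const cv::Rect button(50, 50, 200, 120);      // virtual button area in the camera frame
    cv::Mat prevGray;

    for (;;) {
        cv::Mat frame, gray;
        if (!cap.read(frame)) break;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        if (!prevGray.empty()) {
            cv::Mat diff, mask;
            cv::absdiff(gray(button), prevGray(button), diff);
            cv::threshold(diff, mask, 25, 255, cv::THRESH_BINARY);

            // If more than 30% of the button area changed, treat it as a press.
            double changed = cv::countNonZero(mask) / double(button.area());
            if (changed > 0.3)
                cv::putText(frame, "PRESSED", button.tl(), cv::FONT_HERSHEY_SIMPLEX,
                            1.0, cv::Scalar(0, 0, 255), 2);
        }
        prevGray = gray.clone();

        cv::rectangle(frame, button, cv::Scalar(0, 255, 0), 2);
        cv::imshow("virtual button", frame);
        if (cv::waitKey(30) == 27) break;         // Esc to quit
    }
    return 0;
}
```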

In order to improve the accuracy of our system, a full examination was performed by two elders, aged 67 and 77. Their family members, as caregivers, were asked to answer the caregiver version of the AD8 questionnaire. All the questions are based on daily life, for example color judgment, shape recognition, and simple calculation. We cannot determine whether a subject has dementia from one test alone; the assessment should combine the assessment test, the caregiver questionnaire, and the judgment of medical staff. The main purpose of our system is preliminary assessment, which makes it easier for medical staff to understand the subject’s situation. The criterion for the high-risk population is that if the percentage of correct answers is lower than 80% and the caregiver questionnaire receives a low score, the subject needs to be kept under observation.
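
This screening rule can be written directly as a small helper. The sketch below is an illustration only; the caregiver-questionnaire judgement is passed in as a flag because its scoring direction and cutoff are applied by the medical staff rather than fixed in our prototype.

```cpp
// Sketch: the screening rule described above.
bool needsObservation(int correctAnswers, int totalQuestions, bool caregiverFlagged)
{
    const double correctnessPct = 100.0 * correctAnswers / totalQuestions;
    // Keep the subject under observation when both indicators point the same way.
    return (correctnessPct < 80.0) && caregiverFlagged;
}
```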

From the dementia assessment, we can learn the subject’s abilities in different aspects. During the process, we found that it was helpful for the subjects to keep calm and focus on the assessment when their family members were sitting nearby. According to the results and the questionnaires answered by their families, neither of the subjects showed signs of dementia (Figure 11).

Figure 11. Correctness ratio of the dementia assessment.

Multisensor device

As previously mentioned, we integrated temperature/acceleration/gyro sensors on an Arduino and developed the device with the open-source Arduino software, using ZigBee to transmit the readings of users’ body temperature and fine movements during the test. The prototypes are described in this section. For the convenience of measurement, these devices were integrated into a single sensing module that users wear on the wrist, so that it does not cause discomfort.

Temperature sensing device

The temperature sensing circuit is based on the TMP36 and an Arduino UNO (Figure 12). Because the device has no power switch, the LED on the Arduino board indicates whether the device is on. The resistor limits the current flowing through the LED so that the LED is not damaged.

Figure 12. Temperature sensing circuit.

The system reads the voltage on the Vout pin shown in Figure 13; the output voltage can be converted to temperature easily using the scale factor of 10 mV/°C (Eq. 1).

Figure 13. Pins of the TMP36.

$$ \text{Temperature in } {}^{\circ}\text{C} = \left[ \left( V_{\text{out}} \text{ in mV} \right) - 500 \right] / 10 $$
(1)

In order to show the result in the system, we connect the Arduino to the computer via USB and use the Arduino’s 5 V pin as the power supply. We connect Vout on the TMP36 to Analog IN pin 0 on the Arduino UNO, so the device does not need any other external power supply (Figure 14).

Figure 14. Temperature sensing device.
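
A minimal Arduino sketch of the wiring and conversion just described is given below; the pin assignment follows the text, while the 9600 baud serial rate and the 5 V analog reference are assumptions for illustration.

```cpp
// Sketch: read the TMP36 on Analog IN pin 0 and convert the voltage to degrees C per Eq. (1).
const int tmp36Pin = A0;

void setup() {
  Serial.begin(9600);                          // assumed baud rate
}

void loop() {
  int raw = analogRead(tmp36Pin);              // 0..1023 over the 0..5 V reference (assumed)
  float mV = raw * (5000.0f / 1023.0f);        // ADC counts -> millivolts
  float tempC = (mV - 500.0f) / 10.0f;         // Eq. (1): subtract 500 mV offset, 10 mV/degree C

  Serial.print("Temperature: ");
  Serial.print(tempC, 2);
  Serial.println(" C");
  delay(1000);                                 // one reading per second
}
```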

The subject can wear this device during the joints mobility measurement to record the temperature variation and to identify which testing item causes the largest variation (Figure 15). If the temperature changes by more than 0.5 degrees, the system records the time and temperature in the database so that medical staff can track the subject’s health state during the measurement (Figure 16).

Figure 15. Wearable device.

Figure 16. Detecting temperature via the device.

In this project, we focus more on understanding the variation of the subject’s body temperature than on the absolute accuracy of the readings. In order to show that the thermometer can be substituted by our device, we compared the results from our device with the results from a thermometer when the indoor temperature was 26°C. The comparison shows that the device’s variation is consistent with the thermometer’s variation, with a stable average difference of 1.88°C (Figure 17). Thus, subjects can wear our device instead of using a thermometer.

Figure 17. Temperature comparison between the results measured by our device and by a thermometer.

Another experiment was designed to compare the differences between a male and a female user while they were doing indoor bicycling exercise at a temperature of 26°C. According to the results, the female’s body temperature was generally higher under the same environmental conditions, and body temperature rose gradually during exercise (Figure 18). By observing the variation of body temperature, we can learn about the subject’s body condition, and medical staff can analyze this information to understand more about the subject’s health.

Figure 18. Body temperature of a male and a female during exercise.

In this paper, in order to observe the fine movement of the wrist, we use the L3G4200D to detect the rotation of the wrist. Moreover, to observe whether the subject’s movement is fluent or not, our system uses the MMA7455L as the sensor to detect the rate of the subject’s movement.

The user wears it together with the temperature sensor so that the device can also detect the wrist motion for the test items of wrist extension and flexion (Figure 19).

Figure 19. Wearable wrist detection device.

The purpose of these tests is not only to finish the test items but also to observe the health condition of the elder. The elder might have potential joint problems if they reach the standard movement but move unevenly (Figure 20).

Figure 20. Wearable rate detection device.
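
One simple way to quantify “uneven movement” from the accelerometer stream is the spread of sample-to-sample changes in acceleration magnitude. The sketch below assumes calibrated samples in g are already available from the MMA7455L, and the threshold is an illustrative value rather than a clinically validated one.

```cpp
// Sketch: flag jerky movement by the standard deviation of successive changes in
// acceleration magnitude. Samples are assumed to be calibrated readings in g.
#include <cmath>
#include <vector>

bool movementIsUneven(const std::vector<float>& ax,
                      const std::vector<float>& ay,
                      const std::vector<float>& az,
                      float threshold = 0.15f)      // illustrative threshold, in g
{
    std::vector<float> delta;
    for (std::size_t i = 1; i < ax.size(); ++i) {
        float m0 = std::sqrt(ax[i-1]*ax[i-1] + ay[i-1]*ay[i-1] + az[i-1]*az[i-1]);
        float m1 = std::sqrt(ax[i]*ax[i] + ay[i]*ay[i] + az[i]*az[i]);
        delta.push_back(m1 - m0);                   // change in magnitude between samples
    }
    if (delta.empty()) return false;

    float mean = 0.0f;
    for (float d : delta) mean += d;
    mean /= delta.size();

    float var = 0.0f;
    for (float d : delta) var += (d - mean) * (d - mean);
    var /= delta.size();

    return std::sqrt(var) > threshold;              // large spread -> uneven movement
}
```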

Query interfaces on mobile devices and computers

We developed query interfaces for mobile devices and computers that connect to the database and access the assessment results, so that medical staff can query the results immediately. To enter the system, medical staff need to create an account. After entering the system, if a subject’s information has not been established yet, the medical staff need to create the subject’s basic information for convenient querying. The assessment can then start after this information has been established.

The functions of the mobile and computer interfaces mentioned above are listed as follows (a minimal query sketch is given after the list):

  • Connect to the database

  • Medical staff can register accounts (Figure 21)

  • Build subject’s basic information (Figure 22)

  • Query subject’s information by identity card ID (Figure 23)

  • Record the result of the test

  • Query the results of test (Figures 24 and 25).
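
As an illustration of the query functions listed above, the sketch below assumes an SQLite database and a table named assessment_result with subject_id, test_date, and result columns; the actual database engine and schema of our prototype may differ.

```cpp
// Sketch: query a subject's stored assessment results. SQLite and the table/column names
// (assessment_result, subject_id, test_date, result) are assumptions for illustration.
#include <sqlite3.h>
#include <cstdio>

void printResults(sqlite3* db, const char* subjectId)
{
    const char* sql =
        "SELECT test_date, result FROM assessment_result "
        "WHERE subject_id = ? ORDER BY test_date DESC;";

    sqlite3_stmt* stmt = nullptr;
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK) return;
    sqlite3_bind_text(stmt, 1, subjectId, -1, SQLITE_TRANSIENT);

    while (sqlite3_step(stmt) == SQLITE_ROW) {
        printf("%s  %s\n",
               reinterpret_cast<const char*>(sqlite3_column_text(stmt, 0)),
               reinterpret_cast<const char*>(sqlite3_column_text(stmt, 1)));
    }
    sqlite3_finalize(stmt);
}
```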

Figure 21. Creating a medical staff account. (Figure translation: (1) 帳號: Account, (2) 密碼: Password, (3) 姓名: Name, (4) 性別: Gender, (5) 電話: Telephone, (6) 地址: Address, (7) 員工ID: Staff ID, (8) 所屬醫院: Institution, (9) 所屬科別: Hospital department).

Figure 22. Querying a subject’s information. (Figure translation: (1) 請輸入身分證字號進行查詢: Please enter the identity card ID to search for the subject, (2) 病歷號碼: Record number, (3) 姓名: Name, (4) 所屬醫院: Name of hospital).

Figure 23. Building a subject’s personal information. (Figure translation: (1) 姓名: Name, (2) 性別: Gender, (3) 血型: Blood type, (4) 生日: Date of birth, (5) 病歷號碼: Record number, (6) 所屬醫院: Name of hospital, (7) 身分證字號: Identity card ID, (8) 電話: Telephone, (9) 地址: Address, (10) 藥物過敏: Allergies).

Figure 24. Query results of temperature detection. (Figure translation: (1) 溫度檢測結果: The result of temperature detection, (2) 受測日期: Date, (3) 所屬醫院: Name of hospital, (4) 通訊埠: Port, (5) 受測者姓名: Subject’s name, (6) 病歷號碼: Record ID, (7) 傳輸速率: Baud rate, (8) 溫度測試結果: The result of temperature detection).

Figure 25. Query result of joints mobility assessment. (Figure translation: (1) 查詢動作檢測結果: Query the result of the joints mobility assessment, (2) 受測日期: Date, (3) 所屬醫院: Name of hospital, (4) 受測者姓名: Subject’s name, (5) 病歷號碼: Record number, (6) 動作測試結果: The result of the joints mobility assessment, (7) 右肩關節伸展結果: The result of right shoulder extension, (8) 右手軸彎曲結果: The result of right elbow extension, (9) 右腳後跟伸結果: The result of right hip flexion, (10) 右髖關節測試結果: The result of right hip extension).

Discussion

In the joint mobility measurement, the subject was able to perform the measurement according to the demonstration animation, and the degree of joint mobility was recorded in the database. Through the subject’s movements we can observe not only whether he or she has the ability to perform self-care but also the inconveniences that might occur in everyday life. The experimental results show that our system has the potential to enable the ICF assessment to be performed by users themselves or with assistance from family, thereby relieving the burden on medical staff.

In the experiment, we found that the elders could recollect their memories when asked questions about family affairs. During the dementia test, when the subjects were asked about their dates of birth, they both incidentally mentioned funny stories from their childhoods. Subject 1 is a traditional farmer living with his family, and Subject 2 is Subject 1’s wife, a housekeeper (Table 2). Both of them were confused when asked “What date is today?”, although they could answer “What day is today?”. From the results, we could tell that Subject 1 was good at calculating because he could count the price of his harvest correctly. In contrast, Subject 2’s calculating ability was not as good as Subject 1’s; although Subject 2 performed poorly at calculation, we determined that this was caused by less use of calculation in daily life. The questionnaire answered by the family/caregiver is also important information for decision support. Through our system, early detection of dementia might be achieved, possibly providing patients with better treatment options and improving the quality of life of patients and their families.

Table 2 Subjects’ information

Conclusion

Our research results comprise the following three parts. First, our system can successfully track the user’s movement; consequently, most of the measurements, which were originally performed by subjects with the assistance of medical staff, can now be recorded automatically with the help of a volunteer, so our system is able to relieve the burden on medical staff. Second, our system can measure and visualize the user’s body temperature automatically; in addition, through our graphical interface, helpers and medical staff can obtain auxiliary information about the subject’s body condition more easily than in the conventional way. Third, our system provides a user-friendly multimodal interface for elders, such as virtual buttons with a webcam-based gesture recognizer.

The current version still needs a great deal of work. In the joint mobility measurement, we are adding a voice guide to encourage users while they perform the assessment. We are also implementing a wireless sensor network so that we can overcome some measuring problems for complex joints, such as the ankle and wrist.

Moreover, understanding the messages presented by the computer is not always easy, especially for the elderly. Therefore, we would like to develop a more flexible and customizable interface that can provide the necessary services according to the user’s needs by introducing more AI approaches (Figure 26).

Figure 26. Future work.

For validation, we suggest comparing the joint mobility measurement results of the system with those obtained by an expert as the gold standard; a large number of subjects would be needed to compute statistics such as P-values. The dementia questions are even more complicated: we may need to compare the answers given by subjects using the computer with those given to an expert. We are planning to test our system on more patients with the help of medical staff in the near future. Finally, encouraged by the positive feedback from both medical staff and elders, we are also planning to further develop our system into a more general platform for visualizing medical metadata.

In this paper, we described an ongoing project on a sensor-based decision supporting system for the ICF. Many parts of this project are still at an early stage. However, the first goal, developing a conceptual prototype by utilizing the advice of medical staff, is reported in this paper, and the demand for this line of research in the medical field continues to grow.

References

  • Directorate-General for the Information Society and Media. (2010). Advancing and applying Living Lab methodologies: an update on Living Labs for user-driven open innovation in the ICT domain. European Union: European Network of Living Labs publication.


  • Eriksson, M, Niitamo, V-P, & Kulkki, S. (2005). State-of-the-art in utilizing Living Labs approach to user-centric ICT innovation–a European approach. Technology 13, 1, 1–13.


  • Liao, H-F, & Huang, A-W. (2009). Introduction to International Classification of Functioning, Disability and Health (ICF) and recommendations for its application in disability evaluations in Taiwan. Formosan Journal of Physical Therapy, 34(5), 310–318.


  • Morris, JC. (1993). The Clinical Dementia Rating (CDR): current version and scoring rules. Neurology, 43, 2412–2414.


  • Muilder, I, & Stappers, PJ. (2009). Co-creating in practice: result and challenges, collaborative innovation: emerging technologies, environments and communities (pp. 22–24).


  • Mulvenna, MD, & Nugent, CD. (2010). Supporting people with dementia using pervasive health technologies. Berlin: Springer.


  • Reed, G, Bufka, L. (2006). Using the ICF in Clinical Practice, Annual North American Collaborating Center Conference on ICF; 12, [Online] http://www.wcpt.org/sites/wcpt.org/files/files/KN-ICF-Clinical_practice.pdf.

  • Shotton, J, Fitzgibbon, A, Cook, M, Sharp, T, Finocchio, M, & Moore, R. (2011). Real-time human pose recognition in parts from single depth images. Computer Vision and Pattern Recognition (CVPR), 2, 1297–1304.


  • World Health Organization. (2001). International classification of functioning, disability and health: ICF. Geneva: World Health Organization.


  • World Health Organization. (2007). International classification of functioning, disability and health: children and youth version: ICF-CY. World Health Organization.


Author information


Corresponding author

Correspondence to Lieu-Hen Chen.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

LHC, WEH and HMH participated in the sequence alignment and drafted the manuscript. YWC provided professional suggestions as medical staff. EHW gave conceptual advice on medical mobile devices. ESS and TY gave conceptual advice on quality of life. YT gave suggestions on the visualization of the user’s daily log. All authors read and approved the final manuscript.

Rights and permissions

Open Access  This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Hsieh, WF., Chen, LH., Hung, HM. et al. An intelligent decision supporting system for international classification of functioning, disability, and health. Vis. in Eng. 3, 11 (2015). https://doi.org/10.1186/s40327-015-0023-5


