Category Archives: Achievements

This project applies automatic methods for the classification and recognition of particles in microscopic images from urine analysis. Automating microscopic urinalysis is worthwhile because detecting particles in these images is repetitive and time consuming; moreover, particles in many medical laboratory samples have irregular shapes and blurred edges. The project applies medical image processing and pattern recognition algorithms to microscopic images in three stages. First, original urinary sediment microscopic images are transformed into binary images by preprocessing: median filtering, conversion of the color image to grayscale, and image segmentation. Second, candidate objects are selected and extracted from the binary images. Third, the extracted objects are classified with an SVM to recognize four kinds of urine sediment components: red blood cells, white blood cells, casts, and calcium oxalate crystals.
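A minimal sketch of this three-stage pipeline in Python, assuming OpenCV and scikit-learn; the Otsu segmentation, the shape features, and the area cutoff are illustrative stand-ins rather than the project's actual choices:

```python
import cv2
import numpy as np
from sklearn.svm import SVC

# The four sediment classes named in the project description.
CLASSES = ["red blood cell", "white blood cell", "cast", "calcium oxalate"]

def preprocess(image_bgr):
    """Stage 1: median filtering, grayscale conversion, and segmentation."""
    denoised = cv2.medianBlur(image_bgr, 5)               # suppress speckle noise
    gray = cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding stands in for the segmentation step here.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary

def extract_objects(binary):
    """Stage 2: select candidate objects and compute simple shape features."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    features = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < 50:                                     # illustrative cutoff for specks
            continue
        perimeter = cv2.arcLength(contour, True)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-9)
        features.append([area, perimeter, circularity])
    return np.array(features)

def classify(binary, classifier):
    """Stage 3: an SVM trained on labelled feature vectors names each object."""
    return [CLASSES[label] for label in classifier.predict(extract_objects(binary))]

# Example usage (requires manually annotated training data X_train, y_train):
#   clf = SVC(kernel="rbf").fit(X_train, y_train)
#   labels = classify(preprocess(cv2.imread("sediment.png")), clf)
```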
Dr. Ayman Ezzat
March 9, 2016
We are pleased to announce our five-year success story brochure.
Dr. Ayman Ezzat
April 18, 2015
Dr. Ayman Ezzat
November 6, 2014
The HCI-LAB Interaction Group GP 2012-2013 won first place in the YIA program for the project entitled ProkiWII.
The YIA Program aims to positively impact the scientific culture in Egypt and to produce better-equipped scientists in the future by helping undergraduate, graduate, and PhD students gain access to training, funding, information, equipment, and supplies that better meet the needs of their research projects at Egyptian national universities.
ProkiWII presents an intuitive interaction technique for everyday objects. We propose a system that allows the user to interact with everyday objects and use them to build smart scenarios. We enrich the objects with a portable projected interface, supported by a 3D accelerometer, in order to build novel user-object interaction scenarios. Moreover, we present double crossing as an interaction technique for manipulating the finger interface. We conducted experiments to test the applicability of the proposed system with different applications. Our results showed that the proposed system has great potential to enrich different application domains, such as object data storage, educational Montessori cubes, and smart shopping goods.
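For readers unfamiliar with double crossing, here is a minimal sketch of one common formulation, in which a target fires when the pointer crosses its boundary twice within a short time window; the boundary model and thresholds are illustrative assumptions, not ProkiWII's exact rules:

```python
import time

class DoubleCrossingDetector:
    """Fires a target when the pointer crosses its boundary twice in quick succession.

    One common formulation of double crossing; the window length and the
    single vertical boundary are illustrative stand-ins.
    """

    def __init__(self, boundary_x, window_s=0.6):
        self.boundary_x = boundary_x   # vertical boundary of the target, in pixels
        self.window_s = window_s       # max time allowed between the two crossings
        self.first_cross = None        # timestamp of the first crossing of a pair
        self.prev_x = None

    def update(self, x):
        """Feed successive pointer x-positions; returns True on a double cross."""
        crossed = (self.prev_x is not None and
                   (self.prev_x - self.boundary_x) * (x - self.boundary_x) < 0)
        self.prev_x = x
        if not crossed:
            return False
        now = time.monotonic()
        if self.first_cross is not None and now - self.first_cross <= self.window_s:
            self.first_cross = None    # consume the pair: target selected
            return True
        self.first_cross = now         # record the first crossing and wait
        return False
```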
Dr. Ayman Ezzat
November 6, 2014
The HCI-LAB Interaction Group won 3rd place in the Kinect Track and was recognized by Microsoft ATL for the GP 2012-2013 project RemoAct: Portable Projected Interface with Hand Gesture Interaction.
RemoAct is a wearable depth-sensing and projection system that makes interaction with the surrounding environment more intuitive by sharing and sending data to nearby people through gestures. Not only can you interact with people, you can also send commands through gestures to other machines in the environment, such as printers, display screens, and projectors. Unlike some other wearable systems, there is no need to wear gloves or to set up every environment in advance; the system offers a mobile and robust solution for interacting with an interface projected onto habitual flat surfaces, or even flat organic surfaces like the user's hand, while interacting with different people and different machines in different environments without any further system configuration.
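As a rough illustration of the gesture-to-device idea, here is a hypothetical dispatch sketch in Python; the gesture names, device names, and transport are invented for illustration and are not RemoAct's actual API:

```python
from typing import Callable

# Hypothetical mapping from a recognized hand gesture aimed at a device
# to the command that device should receive.
GESTURE_COMMANDS = {
    ("swipe_right", "printer"): "print_current_document",
    ("swipe_up", "display"): "mirror_projected_surface",
    ("pinch", "projector"): "project_selection",
}

def dispatch(gesture: str, device: str, send: Callable[[str, str], None]) -> bool:
    """Look up the command for a (gesture, device) pair and send it."""
    command = GESTURE_COMMANDS.get((gesture, device))
    if command is None:
        return False            # unknown pair: ignore the gesture
    send(device, command)       # transport (Wi-Fi, Bluetooth, ...) is left abstract
    return True

# Example: route a swipe toward the printer through a stub transport.
dispatch("swipe_right", "printer", lambda dev, cmd: print(f"{dev} <- {cmd}"))
```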