In the context of tumor research, personalized medical treatments are very expensive and not widely accessible. One aspect of the research involves isolating samples from a biopsy to test several treatments on the same tumor. This project aims to develop a robotic platform capable of selecting, moving, and culturing these different samples to streamline the resulting research and make the technique more affordable.
The project uses the frame of a 3D printer, an Anycubic Mega Zero, together with its controller. A webcam detects and tracks the samples, and three Dynamixel XL430 servos actuate two micropipettes, which serve both as a pneumatic gripper and as a way to automate liquid handling. The following illustrations show the frame of the robot, the end effector with the camera and the two pipette tips, and the actuation of the pipettes.
One of the tips is used exclusively for moving the samples, while the second one mixes and dispenses the gel matrix with the nutrients that keep the samples alive. Both tips are selectable with an actuator and are connected to individual micropipettes, ensuring a precision of ±3 µL while allowing both tips to be used simultaneously.
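The control stack is not detailed here, but the description above suggests a simple split between the printer controller (Cartesian moves sent as G-code) and the Dynamixel bus (pipette actuation). The sketch below illustrates that split; the serial port names, servo ID, and the ticks-per-microliter calibration are illustrative assumptions, not values from the project.

```python
import serial                                  # pyserial, for the Marlin G-code link
from dynamixel_sdk import PortHandler, PacketHandler

# Assumed wiring: printer controller on /dev/ttyUSB0, XL430 bus on /dev/ttyUSB1,
# servo ID 1 driving the transfer pipette's plunger. Register addresses are the
# standard XL430 control table (Protocol 2.0).
ADDR_TORQUE_ENABLE = 64
ADDR_GOAL_POSITION = 116
TICKS_PER_UL = 8          # hypothetical calibration: encoder ticks per microliter
PLUNGER_HOME = 1024       # hypothetical plunger rest position

printer = serial.Serial('/dev/ttyUSB0', 115200, timeout=2)
port = PortHandler('/dev/ttyUSB1')
packet = PacketHandler(2.0)
port.openPort()
port.setBaudRate(57600)
packet.write1ByteTxRx(port, 1, ADDR_TORQUE_ENABLE, 1)

def move_to(x, y, z, feed=3000):
    """Send an absolute G-code move to the printer frame."""
    printer.write(f'G1 X{x:.2f} Y{y:.2f} Z{z:.2f} F{feed}\n'.encode())

def aspirate(volume_ul):
    """Pull the plunger back by a volume-proportional number of ticks."""
    goal = PLUNGER_HOME + int(volume_ul * TICKS_PER_UL)
    packet.write4ByteTxRx(port, 1, ADDR_GOAL_POSITION, goal)

move_to(120, 80, 5)       # hover above a sample well
aspirate(10)              # draw 10 uL to capture the sample
```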
One of the main constraints of this project was to detect and select the tissue samples. The webcam mounted on the end effector makes this possible: it gives the position of each particle and is also used to check whether a sample has been correctly picked up by the pipette tip.
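The exact detection pipeline is not given, but a minimal OpenCV sketch of webcam-based particle localization could look like the following; the assumption that samples appear darker than the dish background is illustrative.

```python
import cv2

def detect_samples(frame_bgr):
    """Return (cx, cy) pixel centroids of candidate tissue samples in a webcam frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Assumed contrast: samples darker than the dish background, so Otsu
    # thresholding with inversion isolates them.
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m['m00'] > 0:
            centroids.append((m['m10'] / m['m00'], m['m01'] / m['m00']))
    return centroids
```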
The first challenge was to guarantee high precision, below the millimeter scale, when estimating the position of the particles. The results on the test patterns are convincing and show high repeatability; here the pattern is used with a needle mounted on the end effector to perforate a sheet.
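Reaching sub-millimeter accuracy requires mapping image coordinates to machine coordinates. One common way to do this, shown here as a sketch rather than the project's actual procedure, is a homography fitted on a few reference marks with known positions in both frames; the coordinates below are made up.

```python
import numpy as np
import cv2

# Hypothetical calibration: four reference marks whose positions are known both
# in image pixels (from the webcam) and in the machine's XY frame in millimeters.
pixel_pts = np.array([[102, 88], [530, 91], [527, 402], [99, 398]], dtype=np.float32)
mm_pts    = np.array([[0, 0], [60, 0], [60, 45], [0, 45]], dtype=np.float32)

H, _ = cv2.findHomography(pixel_pts, mm_pts)

def pixel_to_mm(u, v):
    """Map a detected centroid from image coordinates to machine coordinates."""
    pt = np.array([[[u, v]]], dtype=np.float32)
    x, y = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(x), float(y)
```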
The second aspect is detecting and selecting the different samples according to their size and shape, to ensure more representative results.
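A plausible way to implement this selection, assuming detections come in as OpenCV contours like those produced in the sketch above, is to gate them on area and circularity; the thresholds and pixel scale below are illustrative only.

```python
import math
import cv2

# Hypothetical acceptance window: only samples within a target area range and
# reasonably round are picked, so the cultured aliquots stay comparable.
MIN_AREA_MM2, MAX_AREA_MM2 = 0.5, 2.0
MIN_CIRCULARITY = 0.6
MM_PER_PIXEL = 0.05        # assumed scale from the calibration step above

def is_selectable(contour):
    """Accept a detected contour only if its size and roundness fall in range."""
    area_mm2 = cv2.contourArea(contour) * MM_PER_PIXEL ** 2
    perimeter = cv2.arcLength(contour, True)
    if perimeter == 0:
        return False
    circularity = 4 * math.pi * cv2.contourArea(contour) / perimeter ** 2
    return MIN_AREA_MM2 <= area_mm2 <= MAX_AREA_MM2 and circularity >= MIN_CIRCULARITY
```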
To increase the reliability of the platform, a second camera is used to validate each catch: a macro camera placed on the side inspects the pipette tip.
To assess these images, a convolutional neural network has been trained. Its role is to classify the images while remaining robust to the different sample shapes and lighting conditions. The tests show very high reliability, above 98%, with a false-positive rate close to zero.
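The network architecture is not described, so the following is only a minimal PyTorch sketch of a binary "sample picked / empty tip" classifier of the kind that could fill this role; the input size and layer sizes are assumptions.

```python
import torch
import torch.nn as nn

class TipClassifier(nn.Module):
    """Small CNN that labels a macro-camera crop as 'sample picked' or 'empty tip'."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),            # logits: [empty tip, sample picked]
        )

    def forward(self, x):                # x: (N, 3, 128, 128) normalized crops
        return self.classifier(self.features(x))

# Example inference on one 128x128 crop of the pipette tip
model = TipClassifier()
model.eval()
crop = torch.rand(1, 3, 128, 128)        # stand-in for a preprocessed camera crop
with torch.no_grad():
    probs = torch.softmax(model(crop), dim=1)
print("P(sample picked) =", float(probs[0, 1]))
```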
The platform has been tested on three different types of samples. The results were collected on mouse tissue: spleen, kidney, and colon. These samples give a good idea of the capabilities of the robot.
Coming soon...