Autonomous line follower robot controlled by cell culture
Sayan Biswas
Department of Electrical Engineering, Jadavpur University, Kolkata, India
Email: [email protected]
Abstract—Neuro-electronic hybrids promise a model architecture for computing. Such an architecture could bring the power of biological connections and electronic circuits together into a better computing paradigm, and paradigms that solve real-world tasks with higher accuracy are in demand. Here, a robot is modeled as an autonomous system that navigates by following a particular line. Sensory input from the robot is directed to a cell culture, and in response, motor commands are generated from the culture.
Index Terms—Neuro-electronic hybrid, biological connection, electronic circuits, autonomous system, line follower, sensory inputs, motor commands
I. INTRODUCTION
Computations carried out by the brain are fast, and the brain can take rapid decisions, at times with near-zero reflex time. Although not every kind of decision making falls into this category, there are clear instances where a decision is taken in near-zero reflex time. In a visual search task discriminating a car image from an animal image, near-zero reflex time is expected; deciding whether a colour is black or white takes almost zero reflex time. Such computations are carried out by the brain extremely fast, and these abilities have inspired researchers. Many strategies have been developed from them, among which are neuromorphic devices [1] [2] [3], artificial neural networks [4], and fuzzy systems [5] [6]. Ongoing research aims to make hardware suitable for mimicking neuronal systems [1]. There are challenges, including the versatility of connectivity, which limit the usage of cultures as brain-like computers. Neuro-electronic hybrid systems could prove to be a viable computing architecture: by connecting cultures of neurons properly to the real world, such abilities of the brain can be used to solve real-world problems, providing an essential platform for using biological networks for computing. Neurons are the basic computational and functional units of the brain and are versatile in behavior. Neurons in the brain form interconnected networks that are dynamic in nature; this dynamic nature gives them learning abilities and makes them powerful enough to serve varied functions, making the brain a robust computational system. A hybrid system attempts to solve real-world problems by exploiting these abilities of the brain.
An MEA dish is advantageous and promising for understanding the activity and properties of a neuronal network. Robotic control through such cultures [7] [8] and training [9] of neuronal networks is ongoing work at various labs. Several challenges must be resolved before such systems see practical, real-world application [10]. Decoding the output of a network for a given input is an open problem; to understand such decoding patterns, a deeper understanding of dynamic network systems [11] [12] [13] [14] [15] is essential. The description here deals with a framework for using cell cultures as an information processing tool: the sensory inputs received are classified, and the correct motor commands are generated accordingly. The framework elaborates a proposal for a real-life implementation.

II. LINE FOLLOWER
A line follower robot is an autonomous body expected to navigate by following a specific line. The track on which the robot is expected to navigate is coloured black, and the background is white (figure 2). Such devices are usually controlled by a person using visual feedback from the scene of the track, which is used to generate the respective motor commands on a controller. This involves multisensory perception: computing the action required from the visual input and acting accordingly by sending motor commands to the fingers to produce the desired effect on the robot navigating the track. As this requires human intervention in the control loop, it cannot be referred to as an automated system. The process is shown in figure 1.

Fig. 1: Robot navigation by a human.
Fig. 2: Track.

Several computer vision [16] [17] and machine learning [18] paradigms have succeeded in implementing the same behavior by making silicon chips perform the computation that would otherwise be performed by the brain. Cameras on the robot perform the role of the human eyes. The images from the camera give features of the path on which the robot is navigating, and the direction in which the robot must move is classified from these features using a machine learning classifier. Once the direction is known, motor commands to move the robot in the desired direction are sent to the wheels. Hence the silicon chips and camera do the work of the eye and of multisensory perception [19], relating visual feedback to motor commands. The use of computer vision techniques to extract visual features, followed by a machine learning classifier to generate motor commands, is shown in figure 3.

Fig. 3: Robot navigation by a machine learning classifier.

As specified, the aim here is to use a cell culture to make the control of the robot automated.
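As an illustration of the camera-plus-classifier pipeline of figure 3, the sketch below shows one minimal, hypothetical way to turn a binary camera frame into a motor command. The paper does not fix a feature extractor or classifier, so the centroid-based rule here is an assumption for illustration only.

```python
import numpy as np

def motor_command(frame: np.ndarray, tol: float = 0.15) -> str:
    """Map a binary camera frame (0 = black line, 1 = white floor)
    to a motor command by locating the line's horizontal centroid."""
    h, w = frame.shape
    strip = frame[int(0.7 * h):, :]           # strip of the image nearest the robot
    cols = np.where(strip == 0)[1]            # columns containing line pixels
    if cols.size == 0:
        return "stop"                         # line lost (not a case in the paper)
    offset = (cols.mean() - (w - 1) / 2) / w  # normalized deviation from center
    if offset < -tol:
        return "left"
    if offset > tol:
        return "right"
    return "straight"
```

A learned classifier would replace the threshold rule, but the interface (image in, discrete command out) is the same.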
III. CELL CULTURES AS CONTROLLING UNIT

Multi-electrode array (MEA) [20] recording offers a great opportunity to study network topology, as it provides population recordings: data from multiple electrodes, or channels, which is effective for understanding network events. An MEA has a culture on its surface from which electrical activity is recorded. A framework is modeled here that uses this electrical activity to control a robot so that it navigates following a line. The cells are involved as the controlling unit in one of two ways:
• by photoreceptor cells, or
• by neural cells.
These two are dealt with in the following sections.

IV. PHOTORECEPTOR CELLS
Photoreceptor cells are specialized cells in the retina that perform phototransduction, the conversion of light into electrical signals. Photoreceptors are of great importance as they convert light into signals that stimulate biological processes: these cells absorb photons, which causes a change in the cell's membrane potential. The model proposes the use of such a cell culture on an MEA dish for controlling a line follower.

The robot would be a simple system equipped with a camera serving the purpose of visual feedback, playing the role of the eye. As the camera captures the current position of the robot, the image is projected onto the MEA dish containing the photoreceptor cell culture; the projection of the visual field is shown in figure 5. If the track (figure 2) is followed carefully, one finds there are only three possible visual fields, as shown in figure 6. Photoreceptors respond with different electrical activity to different light intensities, so the projection of different visual fields on the culture is expected to generate different electrical activity on the MEA dish. Achieving control using a photoreceptor cell culture [21] therefore requires a mapping from the recorded electrical activity to the projected visual field. This mapping identifies the visual field, and hence the current location of the robot, from the recorded activity, which in turn allows the correct motor commands to be generated. After the culture is prepared, the three possible visual fields are projected one by one and the corresponding electrical activity is recorded; this yields the required mapping between visual field and electrical activity. Once the mapping is achieved, the robot equipped with the camera is left to navigate the path: it captures the visual field, projects it onto the culture, the resulting electrical activity is recorded, and the proper motor command is generated and sent to the robot. The methodology is as follows:
• The visual fields are projected onto the cell culture, and a mapping between electrical activity and visual field is obtained.
• During the test, the captured images are projected, and motor commands are generated as per the obtained mapping.

Depending on the positioning of the camera in the experiment, an offset has to be determined. The offset prevents the robot from taking the desired motor action as soon as a visual field is detected; the offset is shown in figure 4. If the robot were to take, say, a left turn immediately on encountering visual field A (or the electrical activity corresponding to visual field A), it would go off the track. This demands a predetermined offset. The motor commands to be generated as per the visual field are given in table I.

TABLE I: Motor commands as per the visual field

Visual Field    | Motor Commands
Visual Field A  | Cover offset and take 90° left
Visual Field B  | Go straight
Visual Field C  | Cover offset and take 90° right

The procedure is illustrated in figure 7.
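The calibration-then-lookup procedure of this section can be sketched as follows. The template averaging and nearest-neighbour matching below are assumptions for illustration; the paper specifies only that each visual field is projected, its electrical activity recorded, and the mapping inverted at test time.

```python
import numpy as np

# Motor commands from Table I (offset handling is assumed to be
# applied downstream by the drive controller).
COMMANDS = {
    "A": "cover offset and take 90-degree left",
    "B": "go straight",
    "C": "cover offset and take 90-degree right",
}

def calibrate(recordings: dict) -> dict:
    """Average repeated MEA recordings (trials x electrodes) per
    projected visual field into one template activity vector."""
    return {field: trials.mean(axis=0) for field, trials in recordings.items()}

def classify(activity: np.ndarray, templates: dict) -> str:
    """Match a new activity vector to the nearest calibration template
    and return the corresponding motor command from Table I."""
    field = min(templates, key=lambda f: np.linalg.norm(activity - templates[f]))
    return COMMANDS[field]
```

With only three well-separated visual fields, even this simple nearest-template rule suffices to invert the mapping.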
Fig. 4: Offset.
Fig. 5: Projection of the visual field captured through the camera onto the MEA dish containing photoreceptor cells.
Fig. 6: Visual fields projected on the MEA dish: (a) visual field A, (b) visual field B, (c) visual field C.
Fig. 7: Automated robot navigation by photoreceptor cells.

V. NEURON CELLS

Neuron cultures, generally obtained from rat brain and cultured on an MEA dish, can be implemented as a control unit for an autonomous navigator. The framework discussed here was proposed earlier [10] for an autonomous system that navigates by avoiding obstacles; that framework had an accuracy of about 98%. It is discussed here how the existing framework could be utilized to model a line follower system.

The robot must have two infrared (IR) sensors positioned pointing downward. As the robot navigates, there are only three possible states of view, as shown in figure 8. An IR sensor contains an emitter and a detector: the emitter emits infrared waves, and the detector detects the presence of any reflected IR wave. As white reflects light and black absorbs it, the same holds for infrared waves; hence if the IR sensor detects (1) IR waves the colour is white, and if it does not detect them (0) the colour must be black. The three possible sensor readings therefore correspond to go straight, go left, and go right respectively (figure 8 gives a pictorial representation). Hence, knowing the output of the IR sensors, the desired motor command can be predicted.

Fig. 8: Sensor positions of the robot as it navigates. Blue circles denote the locations of the sensors: (a) sensor position A, (b) sensor position B, (c) sensor position C.
Fig. 9: Automated robot navigation by neural cells.
Fig. 10: Result of a left turn from sensor position A (figure 8a). Blue circles denote the locations of the sensors.

TABLE II: Motor commands as per the sensor position

Sensor Position    | Motor Commands
Sensor Position A  | Take 90° left and a little forward drift
Sensor Position B  | Go straight
Sensor Position C  | Take 90° right and a little forward drift

The activity of the MEA dish is used to infer the output of the IR sensors. It was found that the probability of spike occurrence at a particular electrode, in response to a stimulus sequence at other electrodes, depends on the electrodes stimulated, the order of stimulation, and the timing between the pulses [22]. A network of neurons is well able to distinguish between various inputs based on the electrodes stimulated, the stimulating order, and the timing [10]. This ability of the neuronal network was used to encode inputs to a culture and decode output responses from the culture.
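The decoding step can be sketched as a simple maximum-likelihood decoder over spike/no-spike outcomes. The per-electrode spike probabilities below are illustrative placeholders, not measured values; the paper (following [10], [22]) states only that spike probabilities at the recording electrodes depend on the stimulation pattern.

```python
import math

# Hypothetical calibrated spike probabilities at two recording
# electrodes, one entry per encoded IR input pattern (values illustrative).
SPIKE_PROB = {
    "straight": (0.9, 0.9),
    "left":     (0.1, 0.9),
    "right":    (0.9, 0.1),
}

def decode(spikes: tuple) -> str:
    """Pick the input pattern whose calibrated spike probabilities best
    explain the observed spike (1) / no-spike (0) outcome at the
    recording electrodes (Bernoulli log-likelihood)."""
    def loglik(pattern: str) -> float:
        return sum(
            math.log(p if s else 1.0 - p)
            for s, p in zip(spikes, SPIKE_PROB[pattern])
        )
    return max(SPIKE_PROB, key=loglik)
```

In practice the probabilities would be estimated from repeated stimulation trials during calibration, exactly as the visual-field templates were in section IV.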
From the electrodes, the stimulating electrode is chosen as the one that excited the network to the highest level. The stimulating and recording electrodes are so chosen that the input pattern can be predicted from the activity at the recording electrodes [10]. Hence the input pattern of the IR sensors can be predicted from the activity pattern of the neuron culture, and as this prediction can be made, the corresponding motor commands can be generated, making it possible for the robot to navigate following a line.

Suppose the sensor position is as in figure 8a. If the robot then takes a 90° left turn, it ends in the position described in figure 10, which is undesired. Similarly, the system would attain an undesired position if the robot took a 90° right turn from position C (figure 8c). The position is undesired because one of the sensors lies on the white region while the other is on the black region. This issue is resolved by generating a motor command to take a 90° turn in the desired direction followed by an automated little forward push, so that both sensors come back onto the white region.

Hence an automated navigator using a neuron culture is developed. The corresponding motor commands are shown in table II, and the procedure is illustrated in figure 9.
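The command generation of table II, including the corrective forward drift that resolves the stuck position of figure 10, can be sketched as below. Which sensor maps to which turn is an assumption here, as the paper identifies the states only by figure panels.

```python
def motor_commands(left_on_white: bool, right_on_white: bool) -> list:
    """Translate the two downward IR sensor readings (True = white
    detected, i.e. IR reflection seen) into the command sequence of
    Table II.  The short forward drift after a turn brings both
    sensors back onto white, avoiding the position of figure 10."""
    if left_on_white and right_on_white:
        return ["go straight"]                        # sensor position B
    if not left_on_white:
        return ["take 90-degree left", "forward drift"]   # sensor position A (assumed)
    return ["take 90-degree right", "forward drift"]      # sensor position C (assumed)
```

Feeding this function the decoded sensor states closes the loop from culture activity to wheel commands.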
VI. DISCUSSION AND CONCLUSIONS

This was an implementation model using two types of cell culture to perform the real-world task of navigating by following a line. It was shown how the system could be made autonomous using two different types of cell culture. The techniques using a neuron culture have been implemented practically and were found to have an accuracy of 98% [10]; it is expected that if the same methodology is extended to an autonomous line follower system, as proposed, it would have good accuracy too. The strong idea in this approach is using cell cultures as an information processing tool, classifying the sensory inputs received and accordingly deciding on the correct motor commands to generate. Driving can be viewed as a process of locomotion by obstacle avoidance and path following. An amalgamation of the obstacle-avoiding paradigm [10] and the line follower techniques suggested in this work could pave the way for building new, efficient, and accurate autonomous driving systems.

VII. ACKNOWLEDGMENT
The author would like to thank Shefali, Department of Food Technology and Biochemical Engineering, Jadavpur University, Kolkata, for valuable comments in improving the manuscript. The author would also like to thank the Department of Electrical Engineering, Jadavpur University, Kolkata.

REFERENCES

[1] M. Suri, O. Bichler, D. Querlioz, O. Cueto, L. Perniola, V. Sousa, D. Vuillaume, C. Gamrat, and B. DeSalvo, "Phase change memory as synapse for ultra-dense neuromorphic systems: Application to complex visual pattern extraction," in Electron Devices Meeting (IEDM), 2011 IEEE International. IEEE, 2011, pp. 4–4.
[2] S. Yu, Y. Wu, R. Jeyasingh, D. Kuzum, and H.-S. P. Wong, "An electronic synapse device based on metal oxide resistive switching memory for neuromorphic computation," IEEE Transactions on Electron Devices, vol. 58, no. 8, pp. 2729–2737, 2011.
[3] G. Indiveri, "A neuromorphic VLSI device for implementing 2D selective attention systems," IEEE Transactions on Neural Networks, vol. 12, no. 6, pp. 1455–1463, 2001.
[4] J. E. Dayhoff and J. M. DeLeo, "Artificial neural networks," Cancer, vol. 91, no. S8, pp. 1615–1635, 2001.
[5] D. Nauck, F. Klawonn, and R. Kruse, Foundations of Neuro-Fuzzy Systems. John Wiley & Sons, Inc., 1997.
[6] C.-T. Lin and C. Lee, Neural Fuzzy Systems: A Neuro-Fuzzy Synergism to Intelligent Systems. Prentice-Hall, Inc., 1996.
[7] D. J. Bakkum, Z. C. Chao, and S. M. Potter, "Spatio-temporal electrical stimuli shape behavior of an embodied cortical network in a goal-directed learning task," Journal of Neural Engineering, vol. 5, no. 3, p. 310, 2008.
[8] A. Novellino, P. D'Angelo, L. Cozzi, M. Chiappalone, V. Sanguineti, and S. Martinoia, "Connecting neurons to a mobile robot: an in vitro bidirectional neural interface," Computational Intelligence and Neuroscience, vol. 2007, 2007.
[9] M. E. Ruaro, P. Bonifazi, and V. Torre, "Toward the neurocomputer: image processing and pattern recognition with neuronal cultures," IEEE Transactions on Biomedical Engineering, vol. 52, no. 3, pp. 371–383, 2005.
[10] J. B. George, G. M. Abraham, B. Amrutur, and S. K. Sikdar, "Robot navigation using neuro-electronic hybrid systems," IEEE, 2015, pp. 93–98.
[11] S. Biswas, "Proposal for ranking nodes of neural network using activity index," in Research in Computational Intelligence and Communication Networks (ICRCICN), 2016 Second International Conference on. IEEE, 2016, pp. 224–228.
[12] S. Biswas, "Extraction of network information - quality and quantity - from nodes of neuronal network," accepted at 2016 Fifteenth IEEE International Conference on Information Technology, 2016.
[13] S. Biswas, "How diverse is the network information obtained from the nodes of a biological neural network?" accepted at 2017 Eleventh IEEE International Conference on Intelligent Systems and Control, 2017.
[14] S. Biswas, "Novel algorithm to spatially locate information and activity hub in biological neuronal network," accepted at 2017 Fourth IEEE International Conference on Signal Processing and Integrated Networks, 2017.
[15] E. Bullmore and O. Sporns, "Complex brain networks: graph theoretical analysis of structural and functional systems," Nature Reviews Neuroscience, vol. 10, no. 3, pp. 186–198, 2009.
[16] D. A. Forsyth and J. Ponce, Computer Vision: A Modern Approach. Prentice Hall Professional Technical Reference, 2002.
[17] S. E. Umbaugh, Computer Vision and Image Processing: A Practical Approach Using CVIPtools with CD-ROM. Prentice Hall PTR, 1997.
[18] C. M. Bishop, "Pattern recognition," Machine Learning, vol. 128, 2006.
[19] B. Thakur, A. Mukherjee, A. Sen, and A. Banerjee, "A dynamical framework to relate perceptual variability with multisensory information processing," Scientific Reports, vol. 6, 2016.
[20] M. E. J. Obien, K. Deligkaris, T. Bullmann, D. J. Bakkum, and U. Frey, "Revealing neuronal function through microelectrode array recordings," Frontiers in Neuroscience, vol. 8, p. 423, 2015.
[21] T. Watanabe and M. C. Raff, "Rod photoreceptor development in vitro: intrinsic properties of proliferating neuroepithelial cells change as development proceeds in the rat retina," Neuron, vol. 4, no. 3, pp. 461–467, 1990.
[22] J. B. George, G. M. Abraham, K. Singh, S. M. Ankolekar, B. Amrutur, and S. K. Sikdar, "Input coding for neuro-electronic hybrid systems,"