This paper describes a new human-machine interface based on mouth open/close motions. The user attaches the developed controller to the skin surface so that the mouth open/close states can be observed. The controller is equipped with a tiny magnetic sensor that measures the slight relative displacement between skin points above and below the mouth. To achieve a higher recognition ratio, the controller has a flexible structure that absorbs the slight displacements caused not by deliberate operation but by mastication while eating. The proposed method was then applied to the operation of meal support equipment. The LEDs on four food trays turned on and off in sequence at a constant period, and the user could select the food on a tray by opening the mouth at the moment the LED on the target tray was illuminated. The manipulator scooped the selected food from the tray with a spoon and moved it to the front of the user's mouth. As for the manipulation of scooping food, the human motion could be divided into five processes, and the obtained human characteristics were applied to the motion planning of the meal support manipulator.
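The LED-scanning selection scheme above can be sketched in code. The following is an illustrative sketch only, not the paper's implementation: the sensor interface, the displacement threshold, and the scan period are all assumptions introduced here for clarity.

```python
import time

OPEN_THRESHOLD_MM = 3.0   # assumed: displacement above this means "mouth open"
SCAN_PERIOD_S = 1.0       # assumed: each tray's LED stays lit for this long
NUM_TRAYS = 4

def is_mouth_open(displacement_mm: float) -> bool:
    """Classify one magnetic-sensor displacement sample as open or closed."""
    return displacement_mm > OPEN_THRESHOLD_MM

def select_tray(read_displacement, num_trays: int = NUM_TRAYS,
                scan_period: float = SCAN_PERIOD_S) -> int:
    """Cycle the illuminated tray at a constant period; return the index of
    the tray whose LED is lit when the user opens the mouth.
    `read_displacement` is a callable returning the current reading in mm."""
    tray = 0
    deadline = time.monotonic() + scan_period
    while True:
        if is_mouth_open(read_displacement()):
            return tray                     # select the currently lit tray
        if time.monotonic() >= deadline:
            tray = (tray + 1) % num_trays   # advance the lit LED to next tray
            deadline += scan_period
        time.sleep(0.01)                    # sensor polling interval
```

A real controller would additionally debounce the sensor signal and, as the abstract notes, rely on the flexible mounting to reject displacements caused by chewing rather than deliberate operation.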