Data

ACE Free-living dataset

ACE-FL-E 1.0 from IMWUT 2017 paper "Recognizing Eating from Body-Worn Sensors: Combining Free-living and Laboratory Data"

Basic features This dataset adds free-living data for a subset of the original ACE participants (ACE-FL) and for a new cohort of participants used to test external validity (ACE-E). 110 hours of data are provided for 5 ACE-FL participants, each recorded over two approximately 12-hour days. 144 hours of data are provided for the ACE-E participants: two days each for 5 individuals and 5 days for one individual. All participants wore an earbud microphone and a smartwatch on each wrist while doing all daily activities; the ACE-FL participants additionally wore Google Glass. The dataset includes ground truth of meal times as well as the content of each meal, including photographs. Currently, all motion data (head and both wrists) is included, but audio data is not part of the publicly available dataset.
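As a rough illustration, the sketch below shows one way the motion data could be aligned with the meal ground truth, assuming the download unpacks to timestamped CSV files. The directory layout, file names, and column names are placeholders for illustration only; the README that accompanies the download describes the actual format.

import pandas as pd

# Hypothetical paths and column names; the real layout is described in the
# dataset README.
motion = pd.read_csv("ACE-FL/participant01/day1/left_wrist_motion.csv",
                     parse_dates=["timestamp"])
meals = pd.read_csv("ACE-FL/participant01/day1/meal_ground_truth.csv",
                    parse_dates=["start_time", "end_time"])

# Label each motion sample as eating or non-eating using the meal intervals.
motion["eating"] = False
for _, meal in meals.iterrows():
    in_meal = motion["timestamp"].between(meal["start_time"], meal["end_time"])
    motion.loc[in_meal, "eating"] = True

print("Fraction of samples recorded during meals:", motion["eating"].mean())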

Getting the data Registration form and download available: [here].

Citation
M. Mirtchouk, D. Lustig, A. Smith, I. Ching, M. Zheng, and S. Kleinberg. Recognizing Eating from Body-Worn Sensors: Combining Free-living and Laboratory Data. IMWUT, 1(3), 2017.

License Use of the data is permitted for non-commercial research and education purposes provided you 1) properly credit the data source (citation information above), 2) do not attempt to identify participants in the study, and 3) do not redistribute the data (with or without modification).

ACE dataset (lab)

ACE 1.0 from Pervasive Health 2016 paper "Multimodality Sensing for Eating Recognition"
ACE 1.1 from UbiComp 2016 paper "Automated Estimation of Food Type and Amount Consumed from Body-worn Audio and Motion Sensors"
ACE 1.2 All of the above, plus audio data

Basic features 78 hours of data for 7 participants, each recorded in two approximately 6-hour sessions, eating and doing other daily activities while wearing Google Glass, an earbud microphone, and a smartwatch on each wrist. Includes ground truth at the level of chews and swallows annotated from video, as well as annotations of the type and amount of food and drink consumed in each intake.
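As one example of working with the intake-level annotations, the sketch below summarizes intakes by food type. The file path and column names (food_type, amount_g, n_chews, intake_id) are assumptions made for illustration and may not match the distributed schema; consult the dataset README for the real layout.

import pandas as pd

# Hypothetical file and columns; see the dataset README for the real schema.
intakes = pd.read_csv("ACE/participant01/session1/intake_annotations.csv")

# Summarize amount consumed and chewing effort per food type.
summary = intakes.groupby("food_type").agg(
    total_amount_g=("amount_g", "sum"),
    mean_chews=("n_chews", "mean"),
    n_intakes=("intake_id", "count"),
)
print(summary.sort_values("total_amount_g", ascending=False))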

Getting the data Registration form and download available: [here].

Citation
C. Merck, C. Maher, M. Mirtchouk, M. Zheng, Y. Huang, and S. Kleinberg. Multimodality Sensing for Eating Recognition. In Pervasive Health, 2016.

If you use the food weight and type annotations, also cite: M. Mirtchouk, C. Merck, and S. Kleinberg. Automated Estimation of Food Type and Amount Consumed from Body-worn Audio and Motion Sensors. In UbiComp, 2016.

Updates
ACE 1.1: Annotations of food type and amount consumed in each intake have been added.
ACE 1.2: Audio, with speech redacted, has been added.

License Use of the data is permitted for non-commercial research and education purposes provided you 1) properly credit the data source (citation information above), 2) do not attempt to identify participants in the study, and 3) do not redistribute the data (with or without modification).

GLEAM dataset

From Pervasive Health 2015 paper "Unintrusive Eating Recognition using Google Glass"

Basic features 2 hours of high-resolution activity data for each of 38 participants as they walk, talk, and eat meals, collected with Google Glass.
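A small sketch of summarizing the activity annotations for one participant follows, assuming they are distributed as a CSV of labeled start and end times. The path and column names here are hypothetical; the README inside the archive defines the actual format.

import pandas as pd

# Hypothetical path and columns (activity, start_time, end_time).
activities = pd.read_csv("GLEAM/participant01/activity_annotations.csv",
                         parse_dates=["start_time", "end_time"])

# Total annotated time per activity (walking, talking, eating, ...), in minutes.
activities["duration_s"] = (
    activities["end_time"] - activities["start_time"]
).dt.total_seconds()
print(activities.groupby("activity")["duration_s"].sum() / 60)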

Getting the data The data, along with a more detailed README and annotations of activities, are available at: [GLEAM.tar.gz]

Citation
S. A. Rahman, C. Merck, Y. Huang, and S. Kleinberg. Unintrusive Eating Recognition using Google Glass. In Pervasive Health, 2015.

License The data can be used for any non-commercial purpose, as long as you do not redistribute the data (original or modified) and properly attribute its source. You also may not attempt to identify participants in the study.