Human Generated Data

Title

World's First Fully Automated Restaurant (Seventh of eight): A tray of automatically prepared food leaves the kitchen en route to the dining room by conveyor belt after the individual food items were collected by order boys (background) and placed on a tray with the check. The tray is carried by carhops to the cars.

Date

June 1966

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.61

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Bakery 99.7
Shop 99.7
Person 90.9
Human 90.9
Person 90.3
Creme 88.8
Cake 88.8
Food 88.8
Icing 88.8
Dessert 88.8
Cream 88.8
Ice Cream 88.3
Mammal 80.9
Animal 80.9
Pet 80.9
Cat 80.9
Sweets 64.5
Confectionery 64.5
Deli 55
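
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list could be produced, assuming a local copy of the image; the file name, region, and thresholds are illustrative assumptions, not part of the record:

    # Hypothetical sketch: fetch label/confidence pairs like those listed above.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.1.61.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,      # the list above contains roughly 20 labels
        MinConfidence=50,  # the lowest confidence shown above is Deli at 55
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")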

Clarifai
created on 2019-03-22

people 99.6
furniture 98
adult 98
room 97.3
indoors 97.1
monochrome 95.8
woman 95
food 94.4
two 94.2
one 93
restaurant 92.5
table 92.5
chair 92.4
tableware 92.1
portrait 91.7
kitchenware 91.5
facial expression 88.1
child 86.6
vintage 86.3
old 85.3

Imagga
created on 2019-03-22

room 35.8
interior 35.4
table 32.1
home 30.3
case 24.8
kitchen 24.6
food 24
furniture 23.3
dinner 22.6
glass 20.6
shop 20.3
indoors 20.2
meal 18.7
washbasin 18.7
restaurant 17.3
house 15.9
luxury 15.4
basin 14.9
drink 14.2
vessel 13.9
decoration 13.7
breakfast 13.7
plate 13.5
cup 13.4
decor 13.3
mercantile establishment 13.2
coffee 13
service 13
wedding 12.9
cabinet 12.7
barbershop 12.6
setting 12.5
dining 12.4
smiling 12.3
people 12.3
chair 12.1
party 12
elegant 12
person 11.9
catering 11.7
cutlery 11.7
napkin 11.7
container 11.6
window 11.5
lunch 11.1
event 11.1
wine 11.1
inside 11
banquet 11
wood 10.8
reception 10.8
happy 10.6
china cabinet 10.6
sitting 10.3
domestic 10.2
flower 10
dish 10
furnishing 9.9
style 9.6
knife 9.6
fork 9.6
hotel 9.5
wall 9.4
child 9.3
eating 9.2
adult 9.1
toilet 9
bowl 9
family 8.9
place of business 8.9
silverware 8.8
tablecloth 8.8
lifestyle 8.7
formal 8.6
counter 8.5
modern 8.4
eat 8.4
bathroom 8.3
cook 8.2
celebration 8
design 7.9
cooking 7.9
smile 7.8
china 7.8
black 7.8
serve 7.8
arrangement 7.7
old 7.7
fine 7.6
healthy 7.6
tea 7.5
clean 7.5
equipment 7.5
fun 7.5
one 7.5
place 7.4
glasses 7.4
retro 7.4
kitchenware 7.4
shelf 7.3
utensil 7.1

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

indoor 98.6
counter 53.9
kitchen appliance 18.5
art 18.5
food 8.6
miniature 6.3
museum 5.4
design 4.5

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 99.7%
Sad 1.6%
Surprised 2.1%
Calm 83.5%
Happy 8.4%
Confused 0.7%
Disgusted 2.7%
Angry 1%

AWS Rekognition

Age 26-43
Gender Female, 51.8%
Disgusted 45.4%
Calm 46.1%
Surprised 45.2%
Happy 45.2%
Sad 52.3%
Confused 45.3%
Angry 45.7%

AWS Rekognition

Age 26-43
Gender Female, 50%
Angry 45.4%
Calm 51.2%
Disgusted 45.4%
Confused 45.4%
Sad 46%
Happy 45.5%
Surprised 46%
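
The age range, gender, and per-emotion confidences listed for each detected face are of the kind returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, again assuming a local copy of the image (file name and region are illustrative assumptions):

    # Hypothetical sketch: list age range, gender, and emotion scores per face.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.1.61.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")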

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 90.9%
Ice Cream 88.3%
Cat 80.9%

Categories

Imagga

interior objects 88.1%
food drinks 10.3%